
    Tracking interacting dust: comparison of tracking and state estimation techniques for dusty plasmas

    When tracking a target particle that is interacting with nearest neighbors in a known way, positional data of the neighbors can be used to improve the state estimate. Effects of the accuracy of such positional data on the target track accuracy are investigated in this paper, in the context of dusty plasmas. In kinematic simulations, notable improvement in the target track accuracy was found when including all nearest neighbors in the state estimation filter and tracking algorithm, whereas the track accuracy was not significantly improved by higher-accuracy measurement techniques. The state estimation algorithm, involving an extended Kalman filter, was shown to either remove or significantly reduce errors due to "pixel locking". It is concluded that the significant extra complexity and computational expense to achieve these relatively small improvements are likely to be unwarranted for many situations. For the purposes of determining the precise particle locations, it is concluded that the simplified state estimation algorithm can be a viable alternative to using more computationally-intensive measurement techniques.
    Comment: 11 pages, 6 figures; conference paper: Signal and Data Processing of Small Targets 2010 (SPIE).
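
    The filter described above augments the target's state with the positions of interacting nearest neighbours; as a much simpler point of reference, the sketch below shows only the basic predict/update cycle that such trackers extend, for a single particle with a constant-velocity model and noisy position measurements. The motion model, noise levels and variable names are illustrative assumptions, not the paper's implementation (which uses an extended Kalman filter with neighbour coupling).

```python
import numpy as np

# Minimal linear Kalman filter for one particle: constant-velocity motion
# model, noisy position measurements. All noise levels are assumed values.
dt = 1.0                                   # frame interval (arbitrary units)
F = np.array([[1, 0, dt, 0],               # state transition (constant velocity)
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],                # only position is measured
              [0, 1, 0, 0]], dtype=float)
Q = 1e-3 * np.eye(4)                       # process noise (assumed)
R = 0.25 * np.eye(2)                       # measurement noise, e.g. pixel-locking error

def kalman_step(x, P, z):
    """One predict/update cycle from the previous estimate (x, P) and measurement z."""
    x_pred = F @ x                          # predict
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x_pred + K @ y, (np.eye(4) - K @ H) @ P_pred

# Example: smooth a noisy synthetic straight-line trajectory
rng = np.random.default_rng(0)
true_pos = np.cumsum(np.full((50, 2), 0.5), axis=0)
measurements = true_pos + rng.normal(scale=0.5, size=true_pos.shape)
x, P = np.array([*measurements[0], 0.0, 0.0]), np.eye(4)
for z in measurements[1:]:
    x, P = kalman_step(x, P, z)
print("final position estimate:", x[:2])
```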

    Ideal gas behavior of a strongly-coupled complex (dusty) plasma

    In a laboratory, a two-dimensional complex (dusty) plasma consists of a low-density ionized gas containing a confined suspension of Yukawa-coupled plastic microspheres. For an initial crystal-like form, we report ideal gas behavior in this strongly-coupled system during shock-wave experiments. This evidence supports the use of the ideal gas law as the equation of state for soft crystals such as those formed by dusty plasmas.
    Comment: 5 pages, 5 figures, 5 authors, published version.
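
    For reference, the equation of state the abstract refers to is the textbook ideal-gas law, written here for a two-dimensional dust suspension; the symbols are the usual ones, and nothing beyond what the abstract states is implied.

```latex
% Ideal-gas equation of state for a two-dimensional suspension:
%   P   - 2D pressure (force per unit length)
%   n   - areal number density of microspheres
%   k_B - Boltzmann constant,  T - kinetic temperature of the dust
\begin{equation}
  P = n \, k_{\mathrm{B}} T
\end{equation}
```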

    Afterlife: the post-research affect and effect of software

    Software plays an important role in contemporary research. Aside from its use in administering traditional instruments such as surveys and in data analysis, the widespread use of mobile and web apps for social, medical and lifestyle engagement has led to software becoming a research intervention in its own right. For example, it is not unusual to find apps being studied for their utility as interventions in health and social life. Since the software may persist in use beyond the life of an investigation, this raises questions about the extent of researchers' ethical duties towards participants when the researchers are involved in producing and/or studying that software. Key factors identified include the extent of affect created by the software, the effect it has on a participant's life, the length of the investigation, the cost of maintenance, and participant agency. In this article we discuss the issues raised in such situations, considering them in the context of post-research duties of care and suggesting strategies to balance the burden on researchers with the need for ongoing participant support.

    Composite SUVR: a new method for boosting Alzheimer's disease monitoring and diagnostic performance, applied to tau PET

    Background: Abnormal brain tau protein accumulation is strongly linked to multiple neurodegenerative disorders. Currently, brain tau pathology is quantified in vivo using tau PET by calculating the Standardized Uptake Value Ratio (SUVR) of target and reference regions of interest (ROIs). Recent work (Schwarz et al., 2021) in Alzheimer's Disease (AD) explored various target and reference ROIs to report the performance of SUVR as a biomarker for diagnosis, disease monitoring, and clinical trial efficacy/eligibility (sample size estimate, SSE). Here we introduce a new method and biomarker: Composite SUVR (CUVR).
    Methods: We analyzed longitudinal SUV data from ADNI for the 103 available participants with three or more tau PET scans ([18F]AV-1451): 58 cognitively normal (CN), 21 mild cognitive impairment, and 24 probable AD. In the spirit of SUVR and statistical ROIs (Chen et al., NeuroImage 2010), we calculate CUVR as the SUV ratio of two composite regions. The novelty of our method is that the composite regions are determined by a genetic algorithm that searches the 3^96 possible combinations of regions from FreeSurfer's default atlas. We compare the performance of SUVR with CUVR. Performance metrics follow Schwarz et al.: a linear mixed-effects model quantifies longitudinal group separation by tau accumulation rate (t statistic between fixed effects for CN and AD) and longitudinal precision (standard deviation of the model residuals). CUVR and SUVR values were log-transformed before model fitting. We calculated the SSE for a hypothetical clinical trial designed for 80% power to reduce tau PET accumulation by 20% (vs. placebo) in non-CN individuals.
    Results: Our method identified a CUVR biomarker involving 60 regions. Figure 1 shows the performance improvement of CUVR over the best-performing SUVR (inferior-temporal target; eroded subcortical white matter reference). Group separation improved by 2.9x (t = 9.57 vs 3.32), longitudinal precision by 6.5x (residual std = 0.331% vs 2.14%), and CUVR required a 3.9x smaller sample size (83 vs 318).
    Conclusions: Our simple data-driven approach discovered a new tau PET biomarker, CUVR. Experimental results show state-of-the-art longitudinal group separation, longitudinal precision, and clinical trial enrichment. These performance improvements provide compelling evidence for using CUVR for both eligibility and efficacy in Alzheimer's disease clinical trials, particularly of anti-tau therapies.
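
    To make the search concrete: each of the 96 atlas regions is assigned to the target composite, the reference composite, or neither (hence 3^96 candidate assignments), and a genetic algorithm evolves these assignments to maximize a fitness score. The sketch below, written under those assumptions, uses a simple cross-sectional t statistic as a stand-in fitness; the paper's actual fitness is based on a longitudinal linear mixed-effects model, and the data, names and GA settings here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N_REGIONS = 96                     # FreeSurfer default atlas regions
N_SUBJECTS = 103                   # mirrors the cohort size; data below is synthetic

# Placeholder data: per-subject, per-region SUV values and a group label
# (0 = cognitively normal, 1 = AD). Real values would come from the PET pipeline.
suv = rng.lognormal(mean=0.0, sigma=0.2, size=(N_SUBJECTS, N_REGIONS))
group = rng.integers(0, 2, size=N_SUBJECTS)

def cuvr(assign, suv):
    """Composite SUV ratio for an assignment vector.
    assign[i] in {0: unused, 1: target composite, 2: reference composite}."""
    return suv[:, assign == 1].mean(axis=1) / suv[:, assign == 2].mean(axis=1)

def fitness(assign):
    """Stand-in fitness: Welch t statistic between groups on log-CUVR.
    The paper instead scores longitudinal separation with a mixed-effects model."""
    if not ((assign == 1).any() and (assign == 2).any()):
        return -np.inf                               # both composites must be non-empty
    r = np.log(cuvr(assign, suv))
    a, b = r[group == 1], r[group == 0]
    pooled = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return (a.mean() - b.mean()) / pooled

def evolve(pop_size=60, generations=200, mut_rate=0.02):
    """Simple genetic algorithm over ternary region assignments."""
    pop = rng.integers(0, 3, size=(pop_size, N_REGIONS))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_REGIONS)         # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            flip = rng.random(N_REGIONS) < mut_rate  # random mutation
            child[flip] = rng.integers(0, 3, size=flip.sum())
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

best_assignment, best_score = evolve()
print("regions used:", int((best_assignment != 0).sum()), "fitness:", round(best_score, 2))
```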

    Learning To Pay Attention To Mistakes

    In convolutional neural network based medical image segmentation, the periphery of foreground regions representing malignant tissues may be disproportionately assigned to the background class of healthy tissues [attenUnet, AttenUnet2018, InterSeg, UnetFrontNeuro, LearnActiveContour]. This leads to high false negative detection rates. In this paper, we propose a novel attention mechanism, called Paying Attention to Mistakes, to directly address such high false negative rates. Our attention mechanism steers the models towards false positive identification, which counters the existing bias towards false negatives. The proposed mechanism has two complementary implementations: (a) "explicit" steering of the model to attend to a larger Effective Receptive Field on the foreground areas; (b) "implicit" steering towards false positives, by attending to a smaller Effective Receptive Field on the background areas. We validated our methods on three tasks: 1) binary dense prediction between vehicles and the background using CityScapes; 2) Enhanced Tumour Core segmentation with multi-modal MRI scans in BRATS2018; 3) segmenting stroke lesions using ultrasound images in ISLES2018. We compared our methods with state-of-the-art attention mechanisms in medical imaging, including self-attention, spatial attention and mixed spatial-channel attention. Across all three tasks, our models consistently outperform the baseline models in Intersection over Union (IoU) and/or Hausdorff Distance (HD). For instance, in the second task, the "explicit" implementation of our mechanism reduces the HD of the best baseline by more than 26%, whilst improving the IoU by more than 3%. We believe our proposed attention mechanism can benefit a wide range of medical and computer vision tasks which suffer from over-detection of background.
    Comment: Accepted at BMVC 2020.
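
    As a loose illustration of attention gating with an enlarged effective receptive field, the module below computes a per-pixel gate from a dilated-convolution branch and re-weights the feature map. This is a generic sketch in the spirit of the "explicit" variant described above, not the authors' published architecture; the layer choices and parameters are assumptions.

```python
import torch
import torch.nn as nn

class ERFAttention(nn.Module):
    """Illustrative spatial-attention gate: the gating map is computed by a
    dilated (large effective receptive field) branch and multiplies the features."""

    def __init__(self, channels: int, dilation: int = 4):
        super().__init__()
        self.context = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3,
                      padding=dilation, dilation=dilation),   # wide-ERF branch
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),                                     # per-pixel gate in (0, 1)
        )

    def forward(self, x):
        gate = self.context(x)          # (B, 1, H, W) attention map
        return x * gate                 # re-weighted features

# Example: gate a feature map from a segmentation encoder
feats = torch.randn(2, 32, 64, 64)
attn = ERFAttention(channels=32)
print(attn(feats).shape)                # torch.Size([2, 32, 64, 64])
```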

    Model for monitoring of a charge qubit using a radio-frequency quantum point contact including experimental imperfections

    The extension of quantum trajectory theory to incorporate realistic imperfections in the measurement of solid-state qubits is important for quantum computation, particularly for state preparation, error correction, and readout of computations. Previously this has been achieved for low-frequency (dc) weak measurements. In this paper we extend realistic quantum trajectory theory to include radio-frequency (rf) weak measurements, where a low-transparency quantum point contact (QPC), coupled to a charge qubit, is used to damp a classical oscillator circuit. The resulting realistic quantum trajectory equation must be solved numerically. We present an analytical result for the limit of large dissipation within the oscillator (relative to the QPC), where the oscillator slaves to the qubit. The rf+dc mode of operation is considered, in which the QPC is biased (dc) as well as subjected to a small-amplitude sinusoidal carrier signal (rf). The rf+dc QPC is shown to be a low-efficiency charge-qubit detector, though its efficiency may nevertheless be higher than that of the dc QPC (which is subject to 1/f noise).
    Comment: 12 pages, 2 colour figures. v3 is the published version (minor changes since v2).
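
    For orientation, the idealized diffusive quantum trajectory (stochastic master equation) for a charge qubit monitored in sigma_z can be integrated with a simple Euler-Maruyama step, as sketched below. The paper's realistic rf-QPC equation adds detector imperfections and the classical oscillator circuit on top of this textbook form, and the parameter values here are arbitrary assumptions.

```python
import numpy as np

# Euler-Maruyama integration of the idealized diffusive quantum trajectory
# for a charge qubit continuously monitored in sigma_z with efficiency eta.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

Omega = 1.0        # qubit tunnelling rate (assumed units)
k = 0.2            # measurement strength (assumed)
eta = 0.3          # detector efficiency (< 1 models an imperfect detector)
dt = 1e-3
H = 0.5 * Omega * sx

def D(c, rho):      # Lindblad dissipator D[c]rho
    return c @ rho @ c.conj().T - 0.5 * (c.conj().T @ c @ rho + rho @ c.conj().T @ c)

def Hmeas(c, rho):  # measurement (innovation) superoperator H[c]rho
    s = c @ rho + rho @ c.conj().T
    return s - np.trace(s) * rho

rng = np.random.default_rng(1)
rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0>
zs = []
for _ in range(20000):
    dW = rng.normal(scale=np.sqrt(dt))            # Wiener increment
    drho = (-1j * (H @ rho - rho @ H) + k * D(sz, rho)) * dt \
           + np.sqrt(eta * k) * Hmeas(sz, rho) * dW
    rho = rho + drho
    rho = 0.5 * (rho + rho.conj().T)              # keep Hermitian against round-off
    rho = rho / np.trace(rho).real                # renormalize
    zs.append(np.trace(sz @ rho).real)
print("mean <sigma_z> over the run:", np.mean(zs))
```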

    Multi view based imaging genetics analysis on Parkinson disease

    Longitudinal studies integrating imaging and genetic data have recently become widespread among bioinformatics researchers. Combining such heterogeneous data allows a better understanding of the origins and causes of complex diseases. Through a proposed multi-view workflow, we show the common steps and tools used in imaging genetics analysis, integrating genotyping, neuroimaging and transcriptomic data. We describe the advantages of existing methods for analyzing heterogeneous datasets, using Parkinson's Disease (PD) as a case study. Parkinson's disease is associated with both genetic and neuroimaging factors; however, such imaging genetics associations are still at an early stage of investigation. It is therefore desirable to have a free and open-source workflow that integrates different analysis flows in order to recover potential genetic biomarkers in PD, as in other complex diseases.

    Targeted Screening for Alzheimer's Disease Clinical Trials Using Data-Driven Disease Progression Models

    Heterogeneity in Alzheimer's disease progression contributes to the ongoing failure to demonstrate efficacy of putative disease-modifying therapeutics that have been trialed over the past two decades. Any treatment effect present in a subgroup of trial participants (responders) can be diluted by non-responders who ideally should have been screened out of the trial. How to identify (screen-in) the most likely potential responders is an important question that is still without an answer. Here, we pilot a computational screening tool that leverages recent advances in data-driven disease progression modeling to improve stratification. This aims to increase the sensitivity to treatment effect by screening out non-responders, which will ultimately reduce the size, duration, and cost of a clinical trial. We demonstrate the concept of such a computational screening tool by retrospectively analyzing a completed double-blind clinical trial of donepezil in people with amnestic mild cognitive impairment (clinicaltrials.gov: NCT00000173), identifying a data-driven subgroup having more severe cognitive impairment who showed clearer treatment response than observed for the full cohort.
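
    The dilution argument can be illustrated with a toy calculation: if only later-stage participants respond, the treatment-effect estimate in the full cohort is weaker than in a subgroup screened in by a (here simulated) disease-stage variable. The snippet below uses entirely synthetic data and a plain t-test as a stand-in for the trial's actual endpoint analysis; it is not the paper's method or data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 400
stage = rng.uniform(0, 1, size=n)                 # model-estimated disease stage (synthetic)
arm = rng.integers(0, 2, size=n)                  # 0 = placebo, 1 = treatment
responder = stage > 0.6                           # assume only later-stage participants respond
decline = rng.normal(loc=-2.0, scale=1.5, size=n) # cognitive change over the trial
decline[(arm == 1) & responder] += 1.0            # treatment effect in responders only

def effect(mask):
    """Two-sample t-test of treatment vs. placebo within the masked subgroup."""
    t, p = stats.ttest_ind(decline[mask & (arm == 1)], decline[mask & (arm == 0)])
    return t, p

print("full cohort:       t=%.2f, p=%.3f" % effect(np.ones(n, dtype=bool)))
print("screened subgroup: t=%.2f, p=%.3f" % effect(stage > 0.6))
```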